Admissible Predictive Density Estimation

Authors

  • LAWRENCE D. BROWN
  • EDWARD I. GEORGE
  • XINYI XU
Abstract

Let X|μ ∼ N_p(μ, v_x I) and Y|μ ∼ N_p(μ, v_y I) be independent p-dimensional multivariate normal vectors with common unknown mean μ. Based on observing X = x, we consider the problem of estimating the true predictive density p(y|μ) of Y under expected Kullback–Leibler loss. Our focus here is the characterization of admissible procedures for this problem. We show that the class of all generalized Bayes rules is a complete class, and that the easily interpretable conditions of Brown and Hwang [Statistical Decision Theory and Related Topics (1982) III 205–230] are sufficient for a formal Bayes rule to be admissible.
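Under a conjugate normal prior the Bayes predictive density in this setup has a well-known closed form, which a small numeric sketch can verify by Monte Carlo. The prior μ ∼ N_p(0, τ²I) and all numeric values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

p, v_x, v_y, tau2 = 3, 1.0, 2.0, 4.0   # dimension and variances (illustrative)
x = rng.normal(size=p)                  # observed X

# Posterior under the conjugate prior mu ~ N_p(0, tau2 I):
s = tau2 / (tau2 + v_x)                 # shrinkage factor
post_mean, post_var = s * x, s * v_x

def log_normal(y, mean, var):
    """Log density of N_p(mean, var * I) at y."""
    return -0.5 * (p * np.log(2 * np.pi * var) + np.sum((y - mean) ** 2) / var)

# The Bayes predictive density p_hat(y|x) = ∫ N_p(y; mu, v_y I) pi(mu|x) dmu
# is again normal: N_p(s x, (v_y + s v_x) I).
y = rng.normal(size=p)
closed_form = log_normal(y, post_mean, v_y + post_var)

# Monte Carlo check: average p(y|mu) over draws of mu from the posterior.
mus = post_mean + np.sqrt(post_var) * rng.normal(size=(200_000, p))
dens = np.exp(-0.5 * np.sum((y - mus) ** 2, axis=1) / v_y) / (2 * np.pi * v_y) ** (p / 2)
mc = np.log(dens.mean())

print(closed_form, mc)  # the two agree to Monte Carlo accuracy
```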


Similar Articles

Admissible Estimators of ?r in the Gamma Distribution with Truncated Parameter Space

In this paper, we consider admissible estimation of the parameter ?r in the gamma distribution with truncated parameter space under the entropy loss function. We obtain the classes of admissible estimators. The result can be applied to estimation of parameters in the normal, lognormal, Pareto, generalized gamma, generalized Laplace and other distributions.
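The entropy loss referred to here is commonly taken to be the Stein-type loss L(θ, δ) = δ/θ − log(δ/θ) − 1; this exact form is an assumption, since the snippet does not define it. A quick sketch confirms it is nonnegative and minimized at δ = θ:

```python
import numpy as np

# Entropy (Stein-type) loss for estimating a positive parameter theta by delta:
#   L(theta, delta) = delta/theta - log(delta/theta) - 1
# (assumed form; the abstract does not spell out the definition).
def entropy_loss(theta, delta):
    r = delta / theta
    return r - np.log(r) - 1.0

theta = 2.0
deltas = np.linspace(0.1, 10.0, 1000)
losses = entropy_loss(theta, deltas)

# The loss is nonnegative and vanishes only at delta = theta.
best = deltas[np.argmin(losses)]
print(best)  # close to 2.0
```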


Exact Minimax Estimation of the Predictive Density in Sparse Gaussian Models.

We consider estimating the predictive density under Kullback-Leibler loss in an ℓ0 sparse Gaussian sequence model. Explicit expressions of the first order minimax risk along with its exact constant, asymptotically least favorable priors and optimal predictive density estimates are derived. Compared to the sparse recovery results involving point estimation of the normal mean, new decision theore...


Estimation of Lower Bounded Scale Parameter of Rescaled F-distribution under Entropy Loss Function

We consider the problem of estimating the scale parameter β of a rescaled F-distribution when β has a lower bounded constraint of the form β ≥ a, under the entropy loss function. An admissible minimax estimator of the scale parameter β, which is the pointwise limit of a sequence of Bayes estimators, is given. Also in the class of truncated linear estimators, the admissible estim...


On the Within-Family Kullback–Leibler Risk in Gaussian Predictive Models (2012)

We consider estimating the predictive density under Kullback–Leibler loss in a high-dimensional Gaussian model. Decision theoretic properties of the within-family prediction error, the minimal risk among estimates in the class G of all Gaussian densities, are discussed. We show that in sparse models, the class G is minimax suboptimal. We produce asymptotically sharp upper and lower bounds on the...


Exact Minimax Predictive Density Estimation and MDL

The problems of predictive density estimation with Kullback-Leibler loss, optimal universal data compression for MDL model selection, and the choice of priors for Bayes factors in model selection are interrelated. Research in recent years has identified procedures which are minimax for risk in predictive density estimation and for redundancy in universal data compression. Here, after reviewing ...
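The redundancy connection mentioned above rests on the identity that the expected extra codelength incurred by compressing with a density q, when the data actually follow p, equals the Kullback–Leibler divergence KL(p‖q). A minimal one-dimensional check, with illustrative normal densities chosen here for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# True density p = N(0, 1); coding density q = N(m, s2).
m, s2 = 0.5, 2.0

# Closed-form KL(p || q) for univariate normals:
kl = 0.5 * (np.log(s2) + (1 + m ** 2) / s2 - 1)

# Redundancy reading: extra nats per symbol when compressing draws from p
# with an ideal code built from q, i.e. E_p[log p(Z) - log q(Z)].
# The common -0.5*log(2*pi) constants cancel in the difference, so they
# are omitted from both log densities below.
z = rng.normal(size=500_000)
excess = np.mean(-0.5 * z ** 2 - (-0.5 * (z - m) ** 2 / s2 - 0.5 * np.log(s2)))

print(kl, excess)  # the Monte Carlo average matches the closed-form KL
```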



Publication year: 2008